
    The First Hundred Years of the Bureau of Labor Statistics

    Get PDF
    [Excerpt] This volume reports on the first century of a government agency whose founders hoped that, by publishing facts about economic conditions, the agency would help end strife between capital and labor. The Bureau's early work included studies of depressions, tariffs, immigrants, and alcoholism and many assignments to investigate and mediate disputes between labor and management. Most of these functions, especially those involving formulation of policy, passed on to other agencies. The Bureau today remains one of the Nation's principal economic factfinders. In writing the book, Drs. Goldberg and Moye had full freedom to interpret events in accordance with their judgments as historians, without conformance to an official view of institutional history. Given the perspective made possible by passing years, the authors offer broader evaluations of the Bureau's early history than of contemporary events.

    Modelling Citation Networks

    Full text link
    The distribution of the number of academic publications as a function of citation count for a given year is remarkably similar from year to year. We measure this similarity as a width of the distribution and find it to be approximately constant from year to year. We show that simple citation models fail to capture this behaviour. We then provide a simple three-parameter citation network model using a mixture of local and global search processes which can reproduce the correct distribution over time. We use the citation network of papers from the hep-th section of arXiv to test our model. For this data, around 20% of citations use global information to reference recently published papers, while the remaining 80% are found using local searches. We note that this is consistent with other studies, though our motivation is very different from previous work. Finally, we also find that the fluctuations in the size of an academic publication's bibliography are important for the model. This is not addressed in most models and needs further work. Comment: 29 pages, 22 figures
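    The mixed local/global search process described in this abstract can be sketched as a toy simulation. Everything below is an illustrative assumption rather than the authors' actual model: the function name, the `p_global=0.2` split, the fixed 50-paper "recent" window, and the copy-a-reference rule for local search are all stand-ins chosen to show the general idea.

    ```python
    import random

    def grow_citation_network(n_papers=500, refs_per_paper=5, p_global=0.2, seed=1):
        """Grow a toy citation network mixing global and local search.

        With probability p_global a reference is drawn from recently
        published papers (global search); otherwise a reference is copied
        from the bibliography of an already-cited paper (local search).
        All parameters here are illustrative, not taken from the paper.
        """
        rng = random.Random(seed)
        refs = {0: []}  # paper id -> list of cited paper ids
        for new in range(1, n_papers):
            cited = set()
            recent = list(range(max(0, new - 50), new))  # "recent" window
            while len(cited) < min(refs_per_paper, new):
                if not cited or rng.random() < p_global:
                    cand = rng.choice(recent)             # global: a recent paper
                else:
                    base = rng.choice(sorted(cited))      # local: follow a reference
                    pool = refs[base]
                    cand = rng.choice(pool) if pool else rng.choice(recent)
                cited.add(cand)
            refs[new] = sorted(cited)
        return refs

    network = grow_citation_network()
    # In-degree (citation count) per paper, the quantity whose distribution
    # the paper studies year by year.
    in_degree = {}
    for paper, cited in network.items():
        for c in cited:
            in_degree[c] = in_degree.get(c, 0) + 1
    ```

    The local step is what produces heavy-tailed citation counts in such models (copying a reference implicitly prefers already-cited papers), while the global step injects recent papers; varying `p_global` trades off between the two.
    
    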

    Neoliberalizing Race

    Get PDF

    Hall effect encoding of brushless dc motors

    Get PDF
    Encoding mechanism integral to the motor and using the permanent magnets embedded in the rotor eliminates the need for external devices to encode information relating the position and velocity of the rotating member

    Recovering Grammar

    Get PDF
    Three major reasons have been proposed for why legal writing professors do not—or should not—teach grammar. First, the argument goes, teaching grammar would take valuable time away from more important, higher-order writing concerns. Second, some legal writing professors do not feel comfortable teaching grammar because, while they can certainly spot grammar problems in their students’ writing, they never learned technical grammar terms themselves. Third, legal writing professors steer clear of grammar because it is perceived to be associated with remedial writing and “mere” skills teaching—associations that further confine legal writing professors to a lower academic status than their clinical and doctrinal peers. In this article, I argue that a broader, rhetorical approach to grammar minimizes the negative associations with grammar teaching. I make the case that we shouldn’t divorce grammar from the “rest” of legal writing because grammar itself is rhetorical: necessary for and deeply tied to meaning-making and social practices. I contend that a rhetorical approach to grammar can actually enhance our field’s language-focused disciplinary identity. Moreover, I argue that a rhetorical approach to grammar will help ensure that students with diverse language practices feel included and supported, while at the same time providing all students with the linguistic-convention awareness that will allow them to write for successful legal practice. Ultimately, because grammar is foundational—constitutive of and integral to all other components of legal writing—I encourage legal writing professors to embrace grammar from a rhetorical perspective and center it as an important and intellectual part of the first-year legal writing course

    Computer-aided verification in mechanism design

    Full text link
    In mechanism design, the gold standard solution concepts are dominant strategy incentive compatibility and Bayesian incentive compatibility. These solution concepts relieve the (possibly unsophisticated) bidders from the need to engage in complicated strategizing. While incentive properties are simple to state, their proofs are specific to the mechanism and can be quite complex. This raises two concerns. From a practical perspective, checking a complex proof can be a tedious process, often requiring experts knowledgeable in mechanism design. Furthermore, from a modeling perspective, if unsophisticated agents are unconvinced of incentive properties, they may strategize in unpredictable ways. To address both concerns, we explore techniques from computer-aided verification to construct formal proofs of incentive properties. Because formal proofs can be automatically checked, agents do not need to manually check the properties, or even understand the proof. To demonstrate, we present the verification of a sophisticated mechanism: the generic reduction from Bayesian incentive compatible mechanism design to algorithm design given by Hartline, Kleinberg, and Malekian. This mechanism presents new challenges for formal verification, including essential use of randomness from both the execution of the mechanism and from the prior type distributions. As an immediate consequence, our work also formalizes Bayesian incentive compatibility for the entire family of mechanisms derived via this reduction. Finally, as an intermediate step in our formalization, we provide the first formal verification of incentive compatibility for the celebrated Vickrey-Clarke-Groves mechanism
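    The incentive property verified for the Vickrey-Clarke-Groves mechanism can be illustrated, far more modestly, by an exhaustive check on a small discrete grid. This is not the paper's formal-verification approach (which uses machine-checked proofs); it is only a sketch of what "no deviation ever beats truthful bidding" means for the single-item Vickrey (second-price) auction, with all function names and the tie-breaking rule chosen here for illustration:

    ```python
    from itertools import product

    def second_price_auction(bids):
        """Vickrey (second-price) auction for one item.

        Returns (winner_index, price): the highest bidder wins and pays
        the second-highest bid; ties go to the lowest index.
        """
        winner = max(range(len(bids)), key=lambda i: (bids[i], -i))
        price = max(b for i, b in enumerate(bids) if i != winner) if len(bids) > 1 else 0
        return winner, price

    def utility(value, bids, bidder):
        """Quasilinear utility: value minus price if the bidder wins, else 0."""
        winner, price = second_price_auction(bids)
        return value - price if winner == bidder else 0

    def check_truthful(values=range(0, 6), n_bidders=3):
        """Exhaustively check dominant-strategy incentive compatibility on a
        grid: for every value and every profile of others' bids, no deviation
        strictly beats bidding one's true value."""
        for value in values:
            for others in product(values, repeat=n_bidders - 1):
                truthful = utility(value, [value, *others], 0)
                for deviation in values:
                    if utility(value, [deviation, *others], 0) > truthful:
                        return False
        return True
    ```

    Such finite checks only cover the enumerated grid; the point of the computer-aided verification described above is precisely to replace this kind of case enumeration with a formal proof that covers all valuations and randomness at once.
    
    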

    Canonical General Relativity on a Null Surface with Coordinate and Gauge Fixing

    Get PDF
    We use the canonical formalism developed together with David Robinson to study the Einstein equations on a null surface. Coordinate and gauge conditions are introduced to fix the triad and the coordinates on the null surface. Together with the previously found constraints, these form a sufficient number of second class constraints so that the phase space is reduced to one pair of canonically conjugate variables: $\mathcal{A}_2$ and $\mathcal{S}^2$. The formalism is related to both the Bondi-Sachs and the Newman-Penrose methods of studying the gravitational field at null infinity. Asymptotic solutions in the vicinity of null infinity which exclude logarithmic behavior require the connection to fall off like $1/r^3$ after the Minkowski limit. This, of course, gives the previous results of Bondi-Sachs and Newman-Penrose. Introducing terms which fall off more slowly leads to logarithmic behavior which leaves null infinity intact and allows for meaningful gravitational radiation, but the peeling theorem does not extend to $\Psi_1$ in the terminology of Newman-Penrose. The conclusions are in agreement with those of Chrusciel, MacCallum, and Singleton. This work was begun as a preliminary study of a reduced phase space for quantization of general relativity. Comment: magnification set; pagination improved; 20 pages, plain TeX